MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
Patent Abstract:
A mobile terminal including a wireless communication unit (110) for performing wireless communication; a camera (121) for taking an image; a display unit (151) for displaying a preview image obtained via the camera (121); and a controller (180) for controlling the display unit (151) to operate in a first state in which a graphic object relating to an image capture function is displayed overlapping the preview image and a second state in which the graphic object is not displayed while the preview image is displayed, according to a user request, and, when a first preset type of touch is detected in a region in which the preview image is displayed in the second state, controlling the camera (121) to capture the preview image according to the preset type of touch.

Publication number: FR3021133A1
Application number: FR1554386
Filing date: 2015-05-15
Publication date: 2015-11-20
Inventors: Kyungmin Cho; Seongik Jeon; Minah Song; Chansoo Kim; Seoyong Park; Jeonghyun Lee
Applicant: LG Electronics Inc.
IPC main classification:
Patent Description:
[0001] The present invention relates to a mobile terminal having an image capture function and a method of controlling said mobile terminal. Terminals can be generally classified as mobile/portable terminals or stationary terminals. Mobile terminals can also be classified as handheld terminals or vehicle-mounted terminals. In addition, mobile terminals have become increasingly functional. Examples of such functions include data and voice communications, image and video capture via a camera, audio recording, playback of music files via a speaker system, and displaying images and video on a display screen. Some mobile terminals include additional functionality such as playing games, while other terminals are configured as media players.

[0002] A user interface environment is also provided allowing users to search for or select functions easily and conveniently. Also, recently, as the resolution and functions of cameras in mobile terminals have improved, the use of cameras in mobile terminals has increased. However, the camera functions and camera interface are limited and sometimes inconvenience the user. Therefore, an object of the present invention is to address the above-noted and other problems of the related art. Another object of the present invention is to provide a mobile terminal and a corresponding method for providing a graphical user interface (GUI) related to optimized image capture. Another object of the present invention is to provide a mobile terminal and a corresponding method for providing an image capture function whereby the user can capture an image by simply touching a preview image.

[0003] To achieve these and other advantages and in accordance with the purpose of this specification, as embodied and broadly described herein, the present invention provides, in one aspect, a mobile terminal including a wireless communication unit configured to perform wireless communication; a camera configured to obtain an image; a display unit configured to display a preview image obtained through the camera; and a controller configured to control the display unit to operate in any one of a first state in which a graphic object relating to an image capture function is displayed overlapping the preview image and a second state in which the graphic object is not displayed while the preview image is displayed, according to a user request, and, when a first preset type of touch is detected in a region on which the preview image is displayed in the second state, control the camera to capture the preview image according to the preset type of touch. In another aspect, the present invention provides a method of controlling a mobile terminal, which includes displaying, through a display unit of the mobile terminal, a preview image obtained via a camera of the mobile terminal; controlling, via a controller of the mobile terminal, the display unit to operate in any one of a first state in which a graphic object relating to an image capture function is displayed overlapping the preview image and a second state in which the graphic object is not displayed while the preview image is displayed, according to a user request; and, when a first preset type of touch is detected in a region on which the preview image is displayed in the second state, controlling, via the controller, the camera to capture the preview image based on the preset type of touch.
In addition, the scope of applicability of the present application will become more apparent from the detailed description provided hereinafter. However, it should be understood that the detailed description and the specific examples, while indicating preferred embodiments of the invention, are provided by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from the detailed description.

[0004] The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments and together with the description serve to explain the principles of the invention.

[0005] In the drawings: FIG. 1A is a block diagram of a mobile terminal according to an embodiment of the present invention; FIGS. 1B and 1C are conceptual views of an example of the mobile terminal, viewed from different directions; FIG. 2 is a conceptual view of a mobile terminal according to an embodiment of the present invention; FIG. 3 is a flowchart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention; FIG. 4 is a conceptual view illustrating the control method of FIG. 3; FIGS. 5A to 5D are conceptual views illustrating a method of controlling an image capture function in various ways according to various touches; FIGS. 6A to 6G are conceptual views illustrating a focus adjustment process with respect to a preview image using touches applied to a display unit; FIGS. 7A to 7C are conceptual views illustrating an embodiment of switching from a second state, in which the output of a graphic object is limited, to a first state, in which the graphic object is output; FIGS. 8A to 8C are conceptual views illustrating another embodiment of switching from a second state, in which the output of a graphic object is limited, to a first state, in which the graphic object is output; FIGS. 9A to 9E are conceptual views illustrating a method of controlling captured images in the second state in which the output of a graphic object is limited; and FIGS. 10A to 10D are conceptual views illustrating a method of performing an image capture function in the second state in which the output of a graphic object is limited.

[0006] A description will now be provided in detail according to embodiments described herein, with reference to the accompanying drawings. For the brief description with reference to the drawings, identical or similar reference numbers may be assigned to the same or equivalent components, and the description of these components will not be repeated. In general, a suffix such as "module" and "unit" may be used to refer to elements or components. The use of such a suffix herein is merely intended to facilitate the description of the specification, and the suffix itself is not intended to give a special meaning or function. In the present invention, that which is well known to those of ordinary skill in the relevant art has generally been omitted for brevity. The mobile terminals described herein may be implemented using a variety of different types of terminals. Examples of such terminals include cellular phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, portable computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)), and the like.
By way of nonlimiting example only, an additional description will be made by referring to particular types of mobile terminals. However, such teachings apply equally to other types of terminals, such as the types noted above. In addition, these teachings can also be applied to stationary terminals such as digital TVs, desktop computers, digital signage, and the like. Reference is now made to FIGS. 1A to 1C, where FIG. 1A is a block diagram of a mobile terminal in accordance with the present invention, and FIGS. 1B and 1C are conceptual views of an example of the mobile terminal, viewed from different directions. Referring now to FIG. 1A, the mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and a power supply unit 190. Implementing all of the illustrated components is not a requirement, and more or fewer components may alternatively be implemented.

[0007] The mobile terminal 100 is shown having the wireless communication unit 110 configured with several commonly implemented components. For example, the wireless communication unit 110 typically includes one or more components that enable wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal is located. As shown in FIG. 1A, the wireless communication unit 110 includes one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115. In addition, the input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is a type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push button, a mechanical key, a programmable key, and the like) to enable a user to input information. Data (e.g., audio, video, image, and the like) is obtained by the input unit 120 and can be analyzed and processed by the controller 180 according to device parameters, user instructions, and combinations thereof. The detection unit 140 is generally implemented using one or more sensors configured to detect internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and the like. For example, in FIG. 1A, the detection unit 140 is shown having a proximity sensor 141 and an illumination sensor 142. If desired, the detection unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (for example, the camera 121), a microphone 122, a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor, among others), and a chemical sensor (e.g., an electronic nose, a health care sensor, a biometric sensor, and the like), to name a few. The mobile terminal 100 can be configured to use information obtained from the detection unit 140, and in particular, information obtained from one or more sensors of the detection unit 140, and combinations thereof.
[0008] The output unit 150 is configured to output various types of information, such as audio, video, tactile output, and the like. The output unit 150 is shown having a display unit 151, an audio output module 152, a haptic module 153, and an optical output module 154. The display unit 151 may have an inter-layer structure or an integrated structure with a touch sensor in order to facilitate a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as serve as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user. The interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, may include any of wired or wireless ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. In some cases, the mobile terminal 100 may perform assorted control functions associated with a connected external device, in response to the external device being connected to the interface unit 160. The memory 170 is generally implemented to store data to support various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs may be downloaded from an external server via wireless communication. Other application programs may be installed in the mobile terminal 100 at the time of manufacturing or shipping, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, making a call, receiving a message, sending a message, and the like). It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100. To drive an application program stored in the memory 170, the controller 180 may control at least some of the components described above with reference to FIG. 1A. In addition, in order to drive the application program, the controller 180 may combine and operate two or more of the components included in the mobile terminal 100. The power supply unit 190 may be configured to receive external electrical power or to provide internal electrical power in order to supply the appropriate electrical power required to operate elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery may be configured to be incorporated in the terminal body, or configured to be removable from the terminal body. At least some of the components may be cooperatively operated to implement operations, control, or methods of controlling the mobile terminal according to various embodiments described hereinafter. Also, the operations, control, or control methods of the mobile terminal may be implemented in the mobile terminal by driving at least one application program stored in the memory 170. Still referring to FIG. 1A, the various components illustrated in this figure will now be described in more detail.
With respect to the wireless communication unit 110, the broadcast receiving module 111 is configured to receive a broadcast signal and/or broadcast-related information from an external broadcast management entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be used to facilitate the simultaneous reception of two or more broadcast channels, or to support switching among broadcast channels. The mobile communication module 112 may transmit and/or receive wireless signals to and from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like. Such network entities form part of a mobile communication network, which is built according to technical standards or communication methods for mobile communications (e.g., Global System for Mobile communication (GSM), Code Division Multiple Access (CDMA), CDMA 2000, EV-DO, Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), and the like). Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video call (telephony) signals, or various data formats for supporting the communication of text and multimedia messages.

[0009] The wireless Internet module 113 is configured to facilitate wireless Internet access. This module may be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 may transmit and/or receive wireless signals over communication networks using wireless Internet technologies. Examples of such wireless Internet access include Wireless Local Area Network (WLAN), Wi-Fi, Wi-Fi Direct, DLNA, WiBro, WiMAX, High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), and the like. The wireless Internet module 113 may transmit/receive data according to one or more of such wireless Internet technologies, and other Internet technologies as well. In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A, and the like, as part of a mobile communication network, the wireless Internet module 113 performs such wireless Internet access. As such, the wireless Internet module 113 may cooperate with, or serve as, the mobile communication module 112. The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTH™, Radio Frequency Identification (RFID), IrDA, Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wi-Fi, Wi-Fi Direct, Wireless USB, and the like. The short-range communication module 114 generally supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless networks.
An example of such wireless networks is a wireless personal area network.

[0010] In some embodiments, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a wearable device, for example, a smart watch, smart glasses, or a head-mounted display (HMD), which can exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100). The short-range communication module 114 can detect or recognize the wearable device, and permit communication between the wearable device and the mobile terminal 100. In addition, when the detected wearable device is a device that is authenticated to communicate with the mobile terminal 100, the controller 180, for example, may cause data processed in the mobile terminal 100 to be transmitted to the wearable device via the short-range communication module 114. Thus, a user of the wearable device may use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the wearable device. Also, when a message is received in the mobile terminal 100, the user can check the received message using the wearable device. The location information module 115 is generally configured to detect, calculate, derive, or otherwise identify a position of the mobile terminal. For example, the location information module 115 includes a GPS module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally operate with any of the other modules of the wireless communication unit 110 to obtain data related to the position of the mobile terminal. For example, when the mobile terminal uses a GPS module, a position of the mobile terminal can be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal can be acquired based on information related to a wireless access point (AP) which transmits or receives a wireless signal to or from the Wi-Fi module. The input unit 120 may be configured to permit various types of input to the mobile terminal 100. Examples of such input include audio, image, video, data, and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 can process image frames of still images or video obtained by image sensors in a video or image capture mode. The processed image frames can be displayed on the display unit 151 or stored in the memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to permit a plurality of images having various angles or focal points to be input to the mobile terminal 100. As another example, the cameras 121 may be located in a stereoscopic arrangement for acquiring left and right images to implement a stereoscopic image.

[0011] The microphone 122 is generally implemented to permit audio input into the mobile terminal 100. The audio input may be processed in a variety of ways according to a function being performed in the mobile terminal 100. If desired, the microphone 122 may include assorted noise removing algorithms to remove unwanted noise generated in the course of receiving external audio.

[0012] The user input unit 123 is a component that permits input by a user. Such user input may enable the controller 180 to control the operation of the mobile terminal 100.
The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like), or a touch-sensitive input, among others. As an example, the touch-sensitive input may be a virtual key or a soft key, which is displayed on a touch screen through software processing, or a touch key that is located on the mobile terminal at a location other than the touch screen. In addition, the virtual key or the visual key may be displayed on the touch screen in various shapes, for example, a graphic, a text, an icon, a video, or a combination thereof. The user input unit 123 may recognize information detected by the detection unit 140, as well as input through the aforementioned mechanical input mechanism and touch input mechanism, as information input from a user. Therefore, the controller 180 may control an operation of the mobile terminal 100 corresponding to the detected information. The detection unit 140 is generally configured to detect one or more of internal information of the mobile terminal, surrounding environment information of the mobile terminal, user information, or the like. The controller 180 generally cooperates with the detection unit 140 to control the operation of the mobile terminal 100 or to execute data processing, a function, or an operation associated with an application program installed in the mobile terminal, based on the sensing provided by the detection unit 140. The detection unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail. The proximity sensor 141 may include a sensor for detecting the presence or absence of an object approaching a surface, or an object located near a surface, using an electromagnetic field, infrared rays, or the like, without mechanical contact. The proximity sensor 141 may be arranged in an inner region of the mobile terminal covered by the touch screen, or near the touch screen. The proximity sensor 141, for example, may include any of a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared proximity sensor, and the like. When the touch screen is implemented as a capacitance type, the proximity sensor 141 can detect the proximity of a pointer to the touch screen by changes in an electromagnetic field, which respond to the approach of an object with conductivity. In this case, the touch screen (touch sensor) can also be categorized as a proximity sensor. The term "proximity touch" will often be used herein to denote the scenario in which a pointer is positioned to be close to the touch screen without touching the touch screen. The term "contact touch" will often be used herein to denote the scenario in which a pointer physically comes into contact with the touch screen. As for the position corresponding to the proximity touch of the pointer relative to the touch screen, such a position will correspond to a position where the pointer is perpendicular to the touch screen. The proximity sensor 141 can detect a proximity touch, and proximity touch profiles (e.g., distance, direction, speed, time, position, motion status, and the like).
In general, the controller 180 processes data corresponding to proximity touches and proximity touch profiles detected by the proximity sensor 141, and causes visual information to be output on the touch screen. In addition, the controller 180 can control the mobile terminal 100 to execute different operations or process different data depending on whether a touch with respect to a point on the touch screen is a proximity touch or a contact touch.

[0013] A touch sensor can detect a touch applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others. For example, the touch sensor may be configured to convert changes of pressure applied to a specific portion of the display unit 151, or to convert a capacitance occurring at a specific portion of the display unit 151, into electrical input signals. The touch sensor may also be configured to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is typically used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen, a stylus, a pointer, or the like.

[0014] When a touch input is detected by a touch sensor, corresponding signals may be transmitted to a touch controller. The touch controller can process the received signals, and then transmit corresponding data to the controller 180. Accordingly, the controller 180 can detect which region of the display unit 151 has been touched.

[0015] Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or combinations thereof. In some embodiments, the controller 180 may execute the same or different controls depending on a type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same or a different control depending on the object which provides a touch input may be decided based on a current operating state of the mobile terminal 100 or an application program currently being executed, for example. The touch sensor and the proximity sensor may be implemented individually, or in combination, to detect various types of touches. Such touches include a short touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like. If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180, for example, can calculate a position of a wave generation source based on information detected by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for the light to reach the optical sensor is much shorter than the time for the ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source can be calculated using this fact. For example, the position of the wave generation source can be calculated using the time difference from the time that the ultrasonic wave reaches the sensor, with light serving as a reference signal (a worked form of this relation is sketched after this paragraph).
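As a purely illustrative aside (not part of the original disclosure), the time-difference relation just described can be written as follows, assuming the optical reference signal arrives effectively instantaneously and writing v_s for the propagation speed of the ultrasonic wave:

\[ d_i = v_s \, (t_i - t_0), \qquad i = 1, 2, 3, \dots \]

Here t_0 is the arrival time of the light at the optical sensor (the reference signal), t_i is the arrival time of the ultrasonic wave at the i-th ultrasonic sensor, and d_i is the resulting source-to-sensor distance. With three or more sensors at known positions p_i, the source position x then follows from the trilateration conditions \( \lVert x - p_i \rVert = d_i \); this last step is an assumption of the sketch, as the description only states that the position is calculated from the time difference.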
The camera 121 typically includes at least one of a camera sensor (CCD, CMOS, etc.), a photo sensor (or image sensors), and a laser sensor. Implementing the camera 121 with a laser sensor can allow the detection of a touch of a physical object with respect to a 3D stereoscopic image. The photo sensor may be laminated on, or overlapped with, the display device. The photo sensor can be configured to scan a movement of the physical object in proximity to the touch screen. In more detail, the photo sensor may include photodiodes and phototransistors in rows and columns to scan content received at the photo sensor using an electrical signal which changes according to the quantity of applied light. Namely, the photo sensor can calculate the coordinates of the physical object according to variation of light to thereby obtain position information of the physical object. The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executing in the mobile terminal 100 or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glasses scheme), an auto-stereoscopic scheme (a glassless scheme), a projection scheme (a holographic scheme), or the like. The audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 can provide audible output related to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer, or the like. The haptic module 153 can be configured to generate various tactile effects that a user feels, perceives, or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is vibration. The strength, pattern, and the like of the vibration generated by the haptic module 153 can be controlled by user selection or setting by the controller. For example, the haptic module 153 may output different vibrations in a combined manner or a sequential manner.

[0016] Besides vibration, the haptic module 153 can generate various other tactile effects, including an effect by stimulation such as a pin arrangement moving vertically to contact the skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, an electrostatic force, an effect of reproducing the sense of cold and warmth using an element that can absorb or generate heat, and the like. The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscle sensation, such as via the user's fingers or arm, as well as to transfer the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100.
The optical output module 154 can output a signal for indicating an event generation using light of a light source. Examples of events generated in the mobile terminal 100 may include message reception, call signal reception, a missed call, an alarm, a schedule notice, an email reception, information reception through an application, and the like. A signal output by the optical output module 154 may be implemented such that the mobile terminal emits monochromatic light or light with a plurality of colors. The signal output may be terminated when the mobile terminal senses that a user has checked the generated event, for example. The interface unit 160 serves as an interface for external devices to be connected to the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive electrical energy to be transferred to elements or components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like. The identification module may be a chip that stores various information for authenticating the authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (also referred to herein as an "identification device") may take the form of a smart card.

[0017] Accordingly, the identification device can be connected with the terminal 100 via the interface unit 160. When the mobile terminal 100 is connected with an external docking station, the interface unit 160 can serve as a passage to allow electrical energy from the docking station to be supplied to the mobile terminal 100, or can serve as a passage to allow various command signals input by the user from the docking station to be transferred to the mobile terminal therethrough. Various command signals or electrical energy input from the docking station may serve as signals for recognizing that the mobile terminal is properly mounted on the docking station.

[0018] The memory 170 can store programs to support operations of the controller 180 and store input/output data (e.g., phonebook, messages, still images, videos, etc.). The memory 170 can store data related to various vibration and audio profiles that are output in response to touch inputs on the touch screen. The memory 170 may include one or more types of storage media including a flash memory, a hard disk, a solid state disk, a silicon disk, a multimedia card micro type, a card-type memory (e.g., SD or DX memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 may also be operated in relation to a network storage device that performs the storage function of the memory 170 over a network, such as the Internet. The controller 180 can typically control the general operations of the mobile terminal 100.
For example, the controller 180 can set or release a lock state for restricting a user from inputting a control command with respect to applications when a state of the mobile terminal meets a preset condition. The controller 180 can also perform the control and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 can control one or a combination of these components in order to implement various embodiments described herein. The power supply unit 190 receives external electrical power or provides internal electrical power and supplies the appropriate electrical power required for operating respective elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, which is typically rechargeable or detachably coupled to the terminal body for charging. The power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160 to which an external charger for supplying electrical power to recharge the battery is electrically connected. As another example, the power supply unit 190 may be configured to recharge the battery wirelessly without the use of the connection port. In this example, the power supply unit 190 can receive electrical energy, transferred from an external wireless electrical power transmitter, using at least one of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.

[0019] Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium, or a similar medium using, for example, software, hardware, or any combination thereof. Referring now to FIGS. 1B and 1C, the mobile terminal 100 is described with reference to a bar-type terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a variety of different configurations. Examples of such configurations include a watch type, bar type, glasses type, or a foldable type, flip type, slide type, swing type, and swivel type in which two or more bodies are combined with each other in a relatively movable manner, and combinations thereof. The description herein will often refer to a particular type of mobile terminal. The mobile terminal 100 will generally include a housing (e.g., frame, housing, cover, and the like) forming the appearance of the terminal. In this embodiment, the housing is formed using a front housing 101 and a rear housing 102. Various electronic components are incorporated in a space formed between the front housing 101 and the rear housing 102. At least one middle housing may additionally be positioned between the front housing 101 and the rear housing 102. The display unit 151 is shown located on the front side of the terminal body to output information. As illustrated, a window 151a of the display unit 151 may be mounted on the front housing 101 to form the front surface of the terminal body together with the front housing 101.

[0020] In some embodiments, electronic components may also be mounted on the rear housing 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card, and the like.
A rear cover 103 is shown covering the electronic components, and this cover may be detachably coupled to the rear housing 102. Thus, when the rear cover 103 is removed from the rear housing 102, the electronic components mounted on the rear housing 102 are externally exposed. As illustrated, when the rear cover 103 is coupled to the rear housing 102, a side surface of the rear housing 102 is partially exposed. In some cases, upon the coupling, the rear housing 102 may also be completely shielded by the rear cover 103. In some embodiments, the rear cover 103 may include an opening for externally exposing a camera 121b or an audio output module 152b. The housings 101, 102, 103 may be formed by injection-molding a synthetic resin or may be formed of a metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti), or the like. As an alternative to the example in which the plurality of housings form an interior space for accommodating components, the mobile terminal 100 may be configured such that one housing forms the interior space. In this example, a mobile terminal 100 having a uni-body is formed such that synthetic resin or metal extends from a side surface to a rear surface. If desired, the mobile terminal 100 may include a water-sealing unit to prevent the introduction of water into the terminal body. For example, the water-sealing unit may include a water-sealing element that is located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the rear cover 103, for sealing the interior space when these housings are coupled.

[0021] The mobile terminal 100 may include the display unit 151, first and second audio output units 152a and 152b, the proximity sensor 141, an illumination sensor 142, an optical output module 154, first and second cameras 121a and 121b, first and second manipulation units 123a and 123b, a microphone 122, an interface unit 160, and the like. Hereinafter, as illustrated in FIGS. 1B and 1C, the mobile terminal 100, in which the display unit 151, the first audio output unit 152a, the proximity sensor 141, the illumination sensor 142, the optical output module 154, the first camera 121a, and the first manipulation unit 123a are disposed on the front surface of the terminal body, the second manipulation unit 123b, the microphone 122, and the interface unit 160 are disposed on the side of the terminal body, and the second audio output unit 152b and the second camera 121b are disposed on the rear surface of the terminal body, will be described by way of example. However, the components are not limited to this configuration. Components may be excluded, replaced, or placed on other surfaces as needed. For example, the first manipulation unit 123a may not be provided on the front surface of the terminal body, and the second audio output unit 152b may be provided on the side of the terminal body rather than on the rear surface of the terminal body. The display unit 151 may display (or output) information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program driven in the mobile terminal 100, or user interface (UI) or graphical user interface (GUI) information according to the execution screen information. The display unit 151 may include a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a
flexible display, a three-dimensional (3D) display, and an electronic ink (e-ink) display. The display unit 151 may be implemented using two display devices, which may implement the same or different display technology. For example, a plurality of display units 151 may be arranged on one side, either spaced apart from each other or integrated, or these devices may be arranged on different surfaces.

[0022] The display unit 151 may also include a touch sensor that detects a touch input received at the display unit. When a touch is input to the display unit 151, the touch sensor can be configured to detect this touch, and the controller 180, for example, can generate a control command or other signal corresponding to the touch. The content input in the touching manner may be a text or numerical value, or a menu item that can be indicated or designated in various modes. The touch sensor may be configured in the form of a film having a touch profile, disposed between the window 151a and a display on a rear surface of the window 151a, or a wire which is patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be formed integrally with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display. The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen can serve as the user input unit 123 (see FIG. 1A). Thus, the touch screen can replace at least some of the functions of the first manipulation unit 123a. The first audio output module 152a may be implemented as a receiver, and the second audio output unit 152b may be implemented as a speaker to output voice audio, alarm sounds, multimedia audio reproduction, and the like. The window 151a of the display unit 151 will typically include an aperture to permit audio generated by the first audio output module 152a to pass. One alternative is to allow audio to be released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front housing 101). In this case, a hole independently formed for outputting audio sounds may not be seen or is otherwise hidden in terms of appearance, thereby further simplifying the appearance and manufacturing of the mobile terminal 100. The optical output module 154 may be configured to output light for indicating an event generation. Examples of such events include message reception, call signal reception, a missed call, an alarm, a schedule notice, an email reception, information reception through an application, and the like. When a user has checked a generated event, the controller can control the optical output module 154 to stop the light output. The first camera 121a can process image frames such as still or moving images obtained by the image sensor in a capture mode or a video call mode. The processed image frames can then be displayed on the display unit 151 or stored in the memory 170. The first and second manipulation units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to provide an input to the mobile terminal 100. The first and second manipulation units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any tactile method that allows the user to perform manipulation such as touch, push, scroll, or the like.
The first and second manipulation units 123a and 123b may also employ any non-tactile method that allows the user to perform manipulation such as proximity touch, hovering, or the like. FIG. 1B illustrates the first manipulation unit 123a as a touch key, but other possibilities include a mechanical key, a push button, a touch key, and combinations thereof. An input received at the first and second manipulation units 123a and 123b may be used in various ways. For example, the first manipulation unit 123a may be used by the user to provide an input to a menu, a home key, a cancel, a search, or the like, and the second manipulation unit 123b may be used by the user to provide an input for controlling a volume level being output from the first or second audio output module 152a or 152b, to switch to a touch recognition mode of the display unit 151, or the like. As another example of the user input unit 123, a rear input unit may be located on the rear surface of the terminal body. The rear input unit can be manipulated by a user to provide an input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scroll, for controlling the volume level being output from the first or second audio output module 152a or 152b, for switching to a touch recognition mode of the display unit 151, and the like. The rear input unit may be configured to permit a touch input, a push input, or combinations thereof. The rear input unit may be located to overlap the display unit 151 of the front side in a thickness direction of the terminal body. For example, the rear input unit may be located on an end portion of the rear side of the terminal body such that a user can easily manipulate it using a forefinger when the user grips the terminal body with one hand. However, the present invention is not limited thereto, and the position of the rear input unit may be changed. When the rear input unit is provided on the rear surface of the terminal body, a new type of user interface can be implemented. Also, when the touch screen or the rear input unit as described above replaces at least some functions of the first manipulation unit 123a provided on the front surface of the terminal body, and therefore the first manipulation unit 123a is omitted from the front surface of the terminal body, the display unit 151 can have a larger screen. As a further possibility, the mobile terminal 100 may include a finger scan sensor that scans a user's fingerprint. The controller 180 can then use fingerprint information detected by the finger scan sensor as part of an authentication procedure. The finger scan sensor may also be installed in the display unit 151 or implemented in the user input unit 123. The microphone 122 is shown located at an end of the mobile terminal 100, but other locations are possible. If desired, multiple microphones can be implemented, with such an arrangement permitting the reception of stereo sounds. The interface unit 160 may serve as a path allowing the mobile terminal 100 to interface with external devices. For example, the interface unit 160 may include one or more of a connection terminal for connecting to another device (for example, an earphone, an external speaker, or the like), a port for near field communication (for example, an IrDA port, a Bluetooth port, a wireless LAN port, and the like), or a power supply terminal for supplying power to the mobile terminal 100.
The interface unit 160 may be implemented in the form of a socket for accommodating an external card, such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for information storage. The second camera 121b is shown located on the rear side of the terminal body and has an image capturing direction that is substantially opposite to the image capturing direction of the first camera 121a. In addition, the second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such cameras may be referred to as an "array camera". When the second camera 121b is implemented as an array camera, images can be captured in various manners using the plurality of lenses, and images with better quality can be obtained. As shown in FIG. 1C, a flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. As shown in FIG. 1C, the second audio output module 152b may be located on the terminal body. The second audio output module 152b may implement stereophonic sound functions together with the first audio output module 152a, and may also be used to implement a speakerphone mode for call communication.

[0023] At least one antenna for wireless communication may be located on the terminal body. The antenna may be installed in the terminal body or formed by the housing. For example, an antenna that configures a portion of the broadcast receiving module 111 may be retractable into the terminal body. Alternatively, an antenna may be formed using a film attached to an inner surface of the rear cover 103, or a housing that includes a conductive material. The power supply unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to an outside of the terminal body. The battery 191 can receive electrical energy via a power source cable connected to the interface unit 160. Also, the battery 191 can be recharged wirelessly using a wireless charger. The wireless charging may be implemented by magnetic induction or electromagnetic resonance. The rear cover 103 is shown coupled to the rear housing 102 for shielding the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from external impact or foreign matter. When the battery 191 is detachable from the terminal body, the rear cover 103 may be detachably coupled to the rear housing 102. An accessory for protecting the appearance or assisting or extending the functions of the mobile terminal 100 may also be provided on the mobile terminal. As one example of an accessory, a cover or pouch for covering or accommodating at least one surface of the mobile terminal 100 may be provided. The cover or pouch may cooperate with the display unit 151 to extend the functions of the mobile terminal 100. Another example of an accessory is a touch pen for assisting or extending a touch input to a touch screen. The mobile terminal, which may include one or more of the components as described above according to an embodiment of the present invention, can display an image received through a camera on the display unit. In more detail, the mobile terminal can display an image received through the camera on the display unit in real time. Here, an image received through the camera may be called a "preview image", an "image", or the like.
The mobile terminal according to an embodiment of the present invention can provide an image capture function of storing an image (preview image) received through the camera in the memory. Here, the operation of the mobile terminal storing an image received through the camera in the memory may be expressed as "capturing an image", "obtaining an image", "capturing a preview image", "capturing on a preview image", "processing an imaging operation on a preview image", "performing an image capture function on a preview image", or the like. Also, without being limited to the above-mentioned expressions, any expression can be used freely as long as it means that an image received through a camera is stored in a memory unit. In one embodiment, the mobile terminal may perform an image capture operation based on a user selection. Such a user selection may be referred to as a "user control command" or a "control command". The user selection can be made in a variety of ways, as illustrated in the sketch following this passage. For example, a user can select the image capture function by touching or pressing a hardware key provided in the mobile terminal, or by touching at least one of a soft key and a visual key output on the display unit 151. Namely, when a hardware key associated with the image capture function is touched or pressed, or when at least one of a soft key and a visual key output on the display unit 151 is touched, the controller 180 can determine that a user control command for performing the image capture function has been received. Based on such a control command, the controller 180 can capture an image input through the camera 121. Also, in addition to these examples, the image capture function can be performed when a user voice corresponding to a preset command is received, when a particular gesture is applied to the mobile terminal, or when a preset motion is detected by the mobile terminal.

[0024] Meanwhile, in one embodiment of the present invention, the image capture function can be executed. Execution of the image capture function may refer to executing an application driven for capturing an image. When the image capture function is executed, the controller 180 can activate the camera 121 in preparation for capturing an image. Also, the controller 180 can output an image input from the camera 121 on the display unit 151. In addition, in one embodiment of the present invention, an image input through the activated camera 121 and output on the display unit 151 is defined as a "preview image". As the preview image, an image input through the camera 121 can be displayed on the display unit 151 in real time. Also, when an image capture operation is performed according to a user selection, the controller 180 can store the preview image output on the display unit 151 in the memory 170. Hereinafter, an operation of the mobile terminal in executing the image capture function will be described with reference to FIG. 2. In particular, FIG. 2 is a conceptual view of a mobile terminal according to an embodiment of the present invention. As described above, the mobile terminal according to an embodiment of the present invention can perform the image capture function. For example, the image capture function may be executed when an icon associated with the image capture function (or an icon of an application) is selected (or touched). When the image capture function is performed via the icon, the display unit 151 may be in an on state.
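For illustration only, the following minimal Kotlin sketch shows the idea of funneling the several user selections listed above into one and the same control command. All names in it (CaptureTrigger, CameraController, CaptureCommandDispatcher) are hypothetical assumptions of this sketch, not identifiers from the disclosure, and the sketch is not the disclosed implementation:

enum class CaptureTrigger { HARDWARE_KEY, SOFT_KEY, VISUAL_KEY, VOICE_COMMAND, GESTURE, MOTION }

class CameraController(private val memory: MutableList<ByteArray> = mutableListOf()) {
    // "Capturing" a preview image here means storing it in the memory.
    fun capture(previewImage: ByteArray) {
        memory.add(previewImage)
    }
}

class CaptureCommandDispatcher(private val camera: CameraController) {
    // Every trigger source listed in the description is processed as the
    // same user control command: capture the current preview image.
    fun onUserSelection(trigger: CaptureTrigger, previewImage: ByteArray) {
        when (trigger) {
            CaptureTrigger.HARDWARE_KEY,
            CaptureTrigger.SOFT_KEY,
            CaptureTrigger.VISUAL_KEY,
            CaptureTrigger.VOICE_COMMAND,
            CaptureTrigger.GESTURE,
            CaptureTrigger.MOTION -> camera.capture(previewImage)
        }
    }
}

The single exhaustive branch makes the point of the passage explicit: the trigger sources differ, but the resulting control command is identical.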
In addition to the method of selecting the icon associated with the image capture function to execute the image capture function, the image capture function may also be executed when at least one of the keys provided in the mobile terminal (for example, at least one of a hardware key and a soft key) is selected. In this case, even though the display unit 151 is in an off state, the controller 180 can perform the image capture function in response to the selection of the key provided in the mobile terminal. When the image capture function is performed as mentioned above, the controller 180 can display a preview image 300 and a graphic object 302 related to the image capture function on the display unit 151, as shown in (a) of FIG. 2. Namely, the controller 180 can output the graphic object 302 related to the image capture function on the display unit 151 so as to overlap the preview image 300. Here, the graphic object 302 may be at least one of the soft key and the visual key, as described above. Also, depending on a user request, the controller 180 can control the camera to perform capturing on the preview image 300. In more detail, based on a user request applied on the graphic object 302 output on the display unit 151, the controller 180 can set functions related to the image capture function (for example, settings, an image capture mode, switching between a front camera and a rear camera, the flash, switching between a still image and a video, entry into the memory, and the like), and can perform capturing on the preview image 300 according to a touch applied to an image capture button. In one embodiment of the present invention, a state in which the graphic object 302 related to the image capture function overlaps the preview image 300, as shown in (a) of FIG. 2, is defined as a "first state". In the mobile terminal according to an embodiment of the present invention, when the image capture function is executed, the graphic object 302 may not be output and only the preview image 300 may be output on the display unit 151, as illustrated in (b) of FIG. 2. In addition, the state in which only the preview image 300 is output on the display unit 151, without the graphic object 302 thereon, as illustrated in (b) of FIG. 2, may be called a state in which the output of the graphic object 302 on the preview image 300 is limited. Also, in one embodiment of the present invention, the state in which the output of the graphic object 302 on the preview image 300 is limited is defined as a "second state". In the second state, the controller 180 can perform capturing on the preview image 300 based on a user request.

[0025] For example, when a touch applied to a region of the display unit 151 on which the preview image 300 is output is detected, the controller 180 can process the sensed touch as an image capture command. Also, the controller 180 can store the captured preview image 300 in the memory 170. In other words, the controller 180 can perform the image capture function in any one of the first state, in which the graphic object 302 and the preview image 300 are output together, and the second state, in which the output of the graphic object 302 on the preview image 300 is limited.

[0026] Also, the controller 180 can determine which of the first state and the second state the display unit is to operate in, based on a user request.
In more detail, according to a user request, the controller 180 may control the display unit 151 to operate in any one of the first state, in which the graphic object 302 related to the image capture function overlaps the preview image 300, and the second state, in which the output of the graphic object 302 on the preview image 300 is limited. For example, according to a user request, the controller 180 may output a menu for selecting any one of the first state and the second state, and when the user selects one of the first state and the second state from the menu, the controller 180 may control the display unit 151 to operate in the selected state. In another example, when a preset type of touch (for example, a quick touch) applied by the user to the display unit 151 is detected, the controller 180 can switch the state of the display unit 151 from the first state to the second state, or from the second state to the first state. When the display unit 151 is in the first state, the controller 180 may capture the preview image 300 based on a user selection (or touch) with respect to the graphic object 302 (image capture button) output on the display unit 151. Hereinafter, the method of performing image capture on the preview image 300 when the display unit 151 is in the second state will be described in detail with reference to FIGS. 3 and 4. FIG. 3 is a flowchart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention, and FIG. 4 is a conceptual view illustrating the control method of FIG. 3. The controller 180 can execute the image capture function, and the image capture function can be executed in a variety of ways, for example according to the selection of (or a touch applied to) an icon associated with the image capture function (or an icon of an application). When the image capture function is executed, the controller 180 may activate the camera 121 and output the preview image 300 input via the camera 121 on the display unit 151. The display unit 151 may include a display region (or an execution screen display region) on which an execution screen of an application is displayed.
[0027] Only the preview image 300 may be output on the display region; namely, a graphic object related to image capture may not be displayed on the display unit 151. In other words, the controller 180 may limit the output of a graphic object related to the image capture function. Since the output of a graphic object is limited, only the preview image is output on the display unit 151, as shown in step S310. Thus, the phenomenon in which a portion of the preview image is covered by a graphic object does not occur. That is, the controller 180 may limit the output of a graphic object related to the image capture function and output the preview image 300 (for example, as a full screen). In more detail, the controller 180 may not output a graphic object that covers a portion of the preview image. The graphic object may include an image capture button, a setting button for changing a setting with respect to the preview image 300, a button for entering a gallery to control an image stored in the memory, a button for switching between an inanimate image capture mode and a video capture mode, and the like. When only the preview image 300 is output, without the graphic object 302 overlapping it, the controller 180 may still perform the image capture function. In this case, since the output of a graphic object including an image capture button, or the like, is limited, the controller 180 may process a user request as an image capture command with respect to the preview image 300.

[0028] Namely, in the second state, in which the output of a graphic object on the display unit 151 is limited, the controller 180 can capture the preview image 300 according to a user request. Here, the user request may be a user touch applied to the display unit 151. In other words, in the second state, when a preset type of touch is detected in the region on which the preview image 300 is output, the controller 180 may process the sensed touch as an image capture command in step S320. After that, the controller 180 can perform image capture according to the detected preset type of touch and store the captured image in the memory 170. Hereinafter, a method of controlling the image capture function according to various touches applied to the region on which a preview image is output in the second state will be described. For example, the controller 180 may capture different types of images according to different types of touch; the types of images may include an inanimate image, a video, and the like. A method of controlling the mobile terminal 100 will now be described in detail with reference to FIG. 5. When a preset type of touch is applied to a portion of the region of the display unit 151 on which the preview image 300 is output, the controller 180 can capture an image. In this case, the controller 180 can perform image capture regardless of the position at which the preset type of touch is applied. Also, when the preset type of touch is applied, the controller 180 may adjust a focus with respect to the preview image 300. The focus adjustment function may be performed differently according to the different preset types of touch applied; different focus adjustment methods can be associated with the different preset types of touch. That is, when any one of the different preset types of touch is detected, the controller 180 may adjust a focus with respect to the preview image 300 according to the method associated with the sensed touch. Also, the controller 180 may adjust a focus in consideration of the position at which the preset type of touch is applied. For example, when a preset type of touch applied to a first region of the preview image 300 is detected, the controller 180 can adjust a focus, and when a preset type of touch applied to a second region of the preview image, different from the first region, is detected, the controller 180 can perform the image capture function without adjusting the focus. The control method relating to the adjustment of the focus by the controller 180 will be described in detail with reference to FIG. 6.
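The flow of steps S310 and S320 can be sketched as follows. This is a minimal illustrative model, not the actual implementation of the controller 180; the DisplayState and PreviewScreen names are invented for the example.

```kotlin
// Simplified model of the two display states described above.
enum class DisplayState { FIRST, SECOND } // FIRST: graphic objects shown; SECOND: preview only

class PreviewScreen(var state: DisplayState = DisplayState.SECOND) {
    val memory = mutableListOf<String>() // stands in for memory 170

    // S310: only the preview image is displayed in the second state.
    // S320: a preset touch on the preview region is processed directly
    // as an image capture command.
    fun onPreviewTouched(isPresetType: Boolean) {
        if (state == DisplayState.SECOND && isPresetType) {
            memory += "captured preview frame"
        }
        // In the FIRST state the touch would instead be routed to the
        // overlaid graphic objects (capture button, settings, and so on).
    }
}
```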
When capture of the preview image is performed according to the detected preset type of touch, the captured image may be stored in the memory 170. This may be understood with reference to FIG. 4. As shown in (a) of FIG. 4, the controller 180 may output the preview image 300 on the display unit 151 in the second state, in which the output of a graphic object related to image capture is limited. Subsequently, as shown in (b) of FIG. 4, when a preset type of touch applied to the region on which the preview image 300 is output is detected, the controller 180 can capture the image, as shown in (c) of FIG. 4.

[0029] The captured image can be stored in the memory 170. As described above, in the mobile terminal according to one embodiment of the present invention, the output of a graphic object on the preview image is limited and only the preview image is provided, whereby the preview image is not covered by a graphic object during image capture. Also, even in the state in which the output of a graphic object is limited, the preview image can be captured through a simple manipulation. Thus, the user can view a sharp image and capture it, and the convenience of the user is increased.

[0030] Hereinafter, a method of controlling the image capture function according to various touches will be described in detail with reference to the accompanying drawings. In particular, FIGS. 5A to 5D are conceptual views illustrating a method of controlling the image capture function in various ways according to various touches. The controller 180 may perform various image capture functions and may perform the image capture function in different image capture modes. Here, the image capture modes can include an inanimate image capture mode, a video capture mode, a shot-and-erase mode (a mode in which a particular part (subject) included in a captured image is eliminated), a High Dynamic Range (HDR) mode, a panorama mode, a virtual reality (VR) panorama mode, a burst mode, an aesthetic shot mode (a mode in which pixels found to have colors distinguished from the neighboring colors are changed to an average value of the neighboring colors), a dual-camera image capture mode (simultaneously capturing images using the front camera and the rear camera), a time machine mode (a mode in which images are captured and stored in a memory at predetermined time intervals, and a preset number of images are displayed according to the time at which the image capture was performed), a smart photo mode (a mode in which the most appropriate image capture mode is set for each situation and image capture is performed accordingly), a sports mode (a mode in which a shutter speed is set to be faster so as to capture fleeting moments), a night mode (a mode in which an aperture is enlarged to increase the amount of input light), and the like. The controller 180 may capture the preview image 300 in different image capture modes according to different preset types of touch. Here, the different preset types of touch can include a short touch (or tap), a long touch, a touch-and-drag, a quick touch, a swipe, a hovering proximity touch, and the like. Different functions relating to image capture can be associated with the different preset types of touch in the memory 170. Namely, when any one of the different preset types of touch is detected, the controller 180 can perform the function
associated with the sensed touch by referring to the association information stored in the memory 170.

[0031] For example, as illustrated in (a) of FIG. 5A, when a first type of touch, corresponding to a control command for capturing an image in the inanimate image capture mode, among the preset types of touch, is detected in the region on which the preview image 300 is output, the controller 180 can capture an inanimate image. For example, when the first type of touch (for example, a short touch or tap) applied to the region on which the preview image 300 is output is detected, an inanimate image can be captured, as shown in (b) of FIG. 5A. In another example, as illustrated in (a) of FIG. 5B, when a second type of touch, corresponding to a control command for capturing an image in the video capture mode and different from the first type of touch among the preset types of touch, is applied to the region on which the preview image 300 is output, a video can be captured. For example, when the second type of touch (for example, a long touch) applied to the region on which the preview image 300 is output is detected, a video may be captured, as shown in (b) of FIG. 5B. After that, in the state in which a video is being captured according to the second type of touch, when the second type of touch is detected again, or when a type of touch (for example, a short touch) different from the second type is detected, the controller 180 may stop capturing the video. In another example, as shown in FIG. 5C, the image capture function can be performed on the preview image 300 in the image capture mode associated with a preset type of touch among the different image capture modes. In more detail, the preset type of touch may be one of a plurality of touches applied in different directions of motion. The plurality of touches applied in different directions of motion may include a touch-and-drag, a quick touch, and the like, and may be associated with image capture control commands in different image capture modes, respectively. As shown in FIG. 5C, the controller 180 may perform the image capture function on the preview image 300 in the image capture mode associated with the sensed touch (one of the plurality of touches applied in different directions of motion) among the different image capture modes. For example, the preset type of touch may be any one of a touch-and-drag applied in a first direction and a touch-and-drag applied in a second direction different from the first direction. Also, the touch-and-drag applied in the first direction can be associated with a first image capture mode, and the touch-and-drag applied in the second direction can be associated with a second image capture mode different from the first image capture mode. In this case, the controller 180 may perform the image capture function in the image capture mode, among the first and second image capture modes, associated with the direction of the applied touch-and-drag. As illustrated in FIG. 5C, when the first image capture mode is associated with a touch applied in the leftward direction, and a touch applied in the leftward direction is detected in the region on which the preview image 300 is output, the controller 180 may perform the image capture function in the first image capture mode according to the sensed touch. Also, as shown in FIG.
5C, when any one touch (for example, a touch applied in a downward direction) among the plurality of touches applied in different directions of motion is detected in the second state, the controller 180 can switch the state of the display unit 151 from the second state to the first state, in which a graphic object related to the image capture function is output. Also, a function of switching between the front and rear cameras may be associated with a preset type of touch. For example, as illustrated in FIG. 5C, a touch-and-drag applied in a third direction may be associated with the front/rear camera switching function. When a touch applied in the third direction is detected, the controller 180 can switch the camera from the front camera to the rear camera, or from the rear camera to the front camera. Therefore, in one embodiment of the present invention, the front and rear cameras can be switched in the second state, in which the output of the graphic object is limited, without additionally calling up a graphic object related to the image capture function. In another example, as shown in FIG. 5D, the controller 180 may perform the image capture function depending on whether an object approaches the display unit 151. The controller 180 can detect an object approaching a predetermined detection surface via the proximity sensor 141, and when the object is detected, the controller 180 can capture the preview image 300. The object may include a finger, the user's face, and the like.

[0032] For convenience of explanation, the term "proximity touch" will be used herein to denote the scenario in which an object is positioned near the display unit 151 without coming into contact with it. The term "contact touch" will be used herein to denote the scenario in which an object physically contacts the display unit 151. When a proximity touch is detected, the controller 180 may capture the preview image 300 according to the sensed proximity touch. In more detail, when a proximity touch is detected for a period longer than a preset period, the controller 180 may capture the preview image 300. In this manner, in one embodiment of the present invention, the proximity touch and the contact touch can be clearly differentiated. In more detail, an object performing a contact touch is positioned above the display unit 151 immediately before it contacts the display unit 151, and thus a contact touch inherently includes a proximity touch. In one embodiment of the present invention, the preview image is captured only when a proximity touch is detected for a period longer than the preset period, and thus the problem of the mobile terminal recognizing a touch as a proximity touch when the user actually wishes to perform a contact touch can be resolved.

[0033] The controller 180 may output the preview image 300 in the state in which the output of a graphic object is limited. Subsequently, as shown in (a) of FIG. 5D, the controller 180 can detect a proximity touch in which an object is positioned near the display unit 151 without being in contact with it. When the proximity touch is detected, the controller 180 can capture the preview image 300, as shown in (b) of FIG. 5D. Thus, in one embodiment of the present invention, by performing the image capture function using a proximity touch, foreign material (such as fingerprints) left by a contact touch can be prevented in advance, and image capture can be performed while viewing a sharp preview image.
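The duration rule that separates a deliberate proximity touch from an incidental hover on the way to a contact touch can be sketched as follows. The threshold value and the timing interface are assumptions made for illustration only.

```kotlin
// Hypothetical detector for the proximity-touch rule described above:
// capture only when the hover persists beyond a preset period.
class ProximityCaptureDetector(private val thresholdMillis: Long = 800L) {
    private var hoverStart: Long? = null

    fun onHoverEnter(nowMillis: Long) {
        hoverStart = nowMillis
    }

    /** Returns true if the hover lasted long enough to count as a capture command. */
    fun onHoverExit(nowMillis: Long): Boolean {
        val start = hoverStart ?: return false
        hoverStart = null
        return nowMillis - start >= thresholdMillis
    }
}
```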
As described above, in the mobile terminal according to one embodiment of the present invention, in the state in which the output of a graphic object is limited, various functions relating to the image capture function may be performed when different preset types of touch are detected. Thus, even without a graphic object covering a portion of the preview image, the user can perform his desired image capture function, and the convenience of the user can be increased.

[0034] Meanwhile, in the mobile terminal according to one embodiment of the present invention, when a preset type of touch is detected in the state in which the output of a graphic object is limited, a focus with respect to the preview image can be adjusted according to the detected touch. In particular, FIGS. 6A to 6G are conceptual views illustrating a process of adjusting a focus with respect to a preview image using touches applied to the display unit. When only the preview image is output on the display unit 151 (the second state), the controller 180 may adjust a focus with respect to the preview image. In one embodiment, the focus function includes an auto focus (AF) function, in which the focus is adjusted automatically when the mobile terminal is held still for a predetermined period, a touch auto focus (TAF) function, in which the focus is adjusted according to the region where a touch applied to the preview image is detected, and the like. Hereinafter, a method by which the controller 180 sets a region of the preview image in which the focus is adjusted will be described with reference to the accompanying drawings.

[0035] As described above with reference to FIGS. 3 and 4, the controller 180 may perform the image capture function with a touch when the output of a graphic object relating to the image capture function is limited. In addition, when a preset type of touch is detected in the region on which the preview image 300 is output in this state, the controller 180 may adjust a focus with respect to the preview image before executing the image capture function. In one embodiment, when a preset type of touch is detected in the region on which the preview image 300 is output, the controller 180 may adjust a focus around the region where the touch has been detected. For example, when a preset type of touch (for example, a short touch) is detected in the region on which the preview image 300 is output, as shown in (a) of FIG. 6A, the controller 180 may adjust a focus around the region in which the touch (short touch) was detected, as shown in (b) of FIG. 6A. After that, as shown in (c) of FIG. 6A, the controller 180 can capture an inanimate image of the preview image 300 whose focus has been adjusted around the region in which the touch was detected. Namely, as illustrated in (a), (b), and (c) of FIG. 6A, when a single short touch is applied, the controller 180 can adjust a focus around the region to which the single short touch was applied and sequentially capture the inanimate image. Alternatively, when a preset type of touch (for example, a short touch) is detected in the region on which the preview image 300 is output, the controller 180 may only adjust a focus around the region in which the touch was detected, as shown in (b) of FIG. 6A.
After that, when a preset type of touch (for example, a short touch) is reapplied in the state in which the focus has been adjusted around the region in which the touch was detected, the controller 180 may capture an inanimate image of the focus-adjusted preview image 300. In another embodiment, when a preset type of touch (for example, a long touch) is detected in the region on which the preview image 300 is output in the second state, as shown in (a) of FIG. 6B, the controller 180 can adjust a focus around the region in which the touch (long touch) was detected, as shown in (b) of FIG. 6B. After that, as shown in (c) of FIG. 6B, the controller 180 can capture a video of the focus-adjusted preview image 300. As shown in FIG. 6B, the controller 180 can perform both the focus adjustment and the video capture with only the single touch (long touch); alternatively, when a single touch (long touch) is applied, the controller 180 can only adjust a focus, and when an additional touch is thereafter applied, the controller 180 can capture a video. Here, the additional touch may be the same type of touch (long touch) already applied to the region on which the preview image 300 is output, or may be a touch (for example, a short touch) different from the already-applied touch, for performing the function corresponding to the already-applied touch (long touch). That is, according to the type of touch initially input, the controller 180 may adjust a focus on the region to which the touch has been applied and also determine the image capture function to be performed with respect to the preview image 300. Also, since the image capture function is determined according to the initially input touch, when a touch (for example, a short touch) different from the initially input touch is detected, the controller 180 may process the focus-adjusted preview image 300 according to the determined image capture function. The controller 180 can thus perform the focus adjustment and the image capture with only a single touch. In more detail, the preset type of touch may include a touch-down applied to the display unit 151 and a touch-up releasing the touch applied to the display unit 151. For example, a short touch may include a short touch-down and a short touch-up, and a long touch may include a long touch-down and a long touch-up. Whether a touch is a short touch or a long touch may be determined based on the period during which the touch is maintained between the touch-down and the touch-up. For example, when the period during which a touch is maintained between the touch-down and the touch-up is shorter than a preset period, the controller 180 may determine that the touch is a short touch, and when that period is longer than the preset period, the controller 180 may determine that the touch is a long touch. When a touch-down is detected on the display unit 151, the controller 180 can process the detected touch-down as a control command for adjusting the focus around the region in which the touch-down was detected, and when the touch-up is detected, the controller 180 can process the detected touch-up as a command to capture an image. Thus, with reference to FIGS. 6A and 6B, the controller 180 can perform the focus adjustment and the image capture with only a single touch.

[0036] In another embodiment, as illustrated in (a) of FIG.
6C, the controller 180 may adjust a focus with respect to the preview image based on a touch-and-drag applied in a preset direction. The controller 180 may adjust a focus according to at least one point included in the path along which the touch-and-drag is detected. For example, when a touch-and-drag applied in a preset direction is detected, the controller 180 may adjust a focus according to at least one of a start point 342 and an end point 344 in the region on which the preview image 300 is output. For example, as illustrated in (a) of FIG. 6C, when a touch-and-drag applied in a preset direction of motion, including the start point 342 and the end point 344, is detected, the controller 180 can adjust a focus according to the start point 342, as shown in (b) of FIG. 6C. After that, as shown in (c) of FIG. 6C, the controller 180 can execute the image capture function on the preview image 300, whose focus was adjusted according to the start point 342, in the image capture mode (for example, the first image capture mode illustrated in FIG. 5C) associated with the touch-and-drag applied in the preset direction of motion. Alternatively, the controller 180 may adjust a focus according to the end point 344 included in the touch-and-drag applied in the preset direction of motion, or may adjust a focus according to both the start point 342 and the end point 344. Also, as described above with reference to FIG. 6A, the controller 180 can perform, with only the single touch (the touch applied in the preset direction of motion), both the focus adjustment and the image capture in the image capture mode associated with that touch; alternatively, when the single touch (the touch applied in the preset direction of motion) is applied, the controller 180 can perform only the focus adjustment, and when an additional touch is thereafter applied, the controller 180 can execute the image capture function in the associated image capture mode. In another embodiment, as illustrated in (a) of FIG. 6D, when a preset type of touch (for example, a touch-and-drag extending to draw a circular pattern from the start point of the touch, hereinafter called a "circular touch-and-drag") is applied, the controller 180 can adjust a focus according to the region to which the touch is applied. For example, when a circular touch-and-drag is detected in the region on which the preview image 300 is output, as shown in (a) of FIG. 6D, the controller 180 may adjust a focus according to the region in which the touch-and-drag was detected, as shown in (b) of FIG. 6D. After that, when a preset type of touch (for example, a short touch, a long touch, or a touch applied in a preset direction) different from the applied touch is detected, the controller 180 can perform the capture function on the focus-adjusted preview image 300. Here, when an image capture mode is associated with the circular touch-and-drag, the controller 180 can adjust a focus according to only the single touch (the circular touch-and-drag) and subsequently capture the preview image in the associated image capture mode. In another example, as illustrated in (a) of FIG. 6E, when a touch-and-drag including at least one intersection point is applied, the controller 180 may adjust a focus according to the at least one intersection point, as shown in (b) of FIG. 6E.
After that, when a preset type of touch is detected, the controller 180 may perform the image capture function on the preview image 300, whose focus has been adjusted according to the at least one intersection point, based on the sensed touch, as shown in (c) of FIG. 6E. In addition, the controller 180 may adjust a focus according to at least one region 364 including the at least one intersection point. When a preset type of touch (for example, a short touch, a long touch, or a touch applied in a preset direction) different from the applied touch is detected, the controller 180 may perform the image capture function on the focus-adjusted preview image 300 according to the sensed touch. Similarly, as described above with reference to FIG. 6D, when an image capture mode is associated with the touch-and-drag including the at least one intersection point, the controller 180 can adjust a focus according to only the single touch (the touch-and-drag including the at least one intersection point) and subsequently capture the preview image in the associated image capture mode. In another embodiment, as shown in (a) of FIG. 6F, the controller 180 may divide the region of the preview image 300 into a plurality of regions. Also, the controller 180 may perform different image capture functions according to the region, among the plurality of divided regions, in which a touch is detected. For example, the controller 180 may adjust a focus according to a touch point applied to the divided regions 332 and 334 and then execute the image capture function, or may directly perform the image capture function without focus adjustment.

[0037] Further, the controller 180 may display, on the preview image 300, a guide line 330 indicating that the preview image has been divided. For example, as shown in (b) of FIG. 6F, when a preset type of touch is applied to the first region 332, the controller 180 may adjust a focus according to the region to which the touch has been applied and perform the image capture function on the preview image 300. Also, as shown in (c) of FIG. 6F, when a preset type of touch is applied to the second region 334, different from the first region 332, among the divided regions, the controller 180 may immediately perform the image capture function on the preview image 300 without adjusting the focus according to the touch. In one embodiment of the present invention, the image capture method executed when a preset type of touch is detected may vary depending on whether or not the focus has already been adjusted with respect to the image. In more detail, depending on whether or not the focus of the preview image has been adjusted, the controller 180 may perform different functions with respect to the same type of touch. First, when the focus with respect to the preview image 300 has been adjusted according to a continuous auto focus (CAF) function, executed automatically when the mobile terminal is held still for a predetermined period, and a preset type of touch (for example, a short touch) is then detected, the controller 180 may capture the preview image 300 without performing a further focus adjustment.
Also, when the focus with respect to the preview image 300 has been adjusted according to the touch auto focus (TAF) function, in which the focus is adjusted according to the region in which a touch is detected in the preview image 300, and a preset type of touch (for example, a short touch) is then detected, the controller 180 may likewise capture the preview image 300 without performing a further focus adjustment. In another example, in the state in which the focus has been adjusted on the preview image 300 by the CAF function or by the TAF function, when a preset type of touch (for example, a short touch) is detected after the lapse of a preset period, the controller 180 may readjust a focus according to the region in which the preset type of touch has been detected.

[0038] In another example, as shown in FIG. 6G, the controller 180 may adjust a focus with respect to the preview image 300 as a function of an object approaching the display unit 151. The controller 180 may detect an object approaching a predetermined detection surface via the proximity sensor 141, and when the object is detected, the controller 180 may capture the preview image 300. The object may include a finger, the user's face, and the like. When a proximity touch is detected, the controller 180 may adjust a focus with respect to the preview image 300 based on the sensed proximity touch. In more detail, when the proximity touch is detected, the controller 180 may adjust a focus with respect to the region of the preview image corresponding to the sensed proximity touch. Also, when the proximity touch is detected for a period longer than a preset period, the controller 180 can adjust a focus with respect to the region of the preview image corresponding to the region in which the proximity touch is detected. As described above, in one embodiment of the present invention, the focus is adjusted only after the preset period has elapsed, whereby a proximity touch and a contact touch can be clearly differentiated. As illustrated in (a) of FIG. 6G, in the state in which the output of a graphic object is limited, the controller 180 can detect a proximity touch in which an object is positioned near the display unit 151 without being in contact with it. When the proximity touch is detected, the controller 180 can adjust a focus with respect to the region of the preview image corresponding to the detected proximity touch. After that, as shown in (b) of FIG. 6G, the controller 180 can capture the preview image 300 when a touch is applied to the focus-adjusted region of the preview image. In another example, when the focus of the preview image has been adjusted, the controller 180 may capture the preview image 300 based on a voice signal input from the outside. As described above, in the mobile terminal according to one embodiment of the present invention, according to a preset type of touch detected in the second state, in which the output of a graphic object is limited, the focus of the preview image is adjusted and the image capture function can be performed. Thus, the user can use the image capture function on a sharp preview image not covered by a graphic object and adjust a focus with respect to the preview image via a simple manipulation, and the user's requirements for capturing a high quality image are thereby met.
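The touch-down/touch-up division of labor described in this section (the touch-down adjusts the focus around the touched point, the touch-up issues the capture command, and the held period decides between an inanimate image and a video) can be sketched as follows. The threshold and the callbacks are illustrative assumptions, not the patented implementation.

```kotlin
// Hypothetical handler combining TAF-style focus with touch-up capture.
data class Point(val x: Int, val y: Int)

class FocusAndCaptureHandler(
    private val longTouchMillis: Long = 500L,      // assumed short/long boundary
    private val adjustFocus: (Point) -> Unit,      // focus around the touched region
    private val captureStill: () -> Unit,          // inanimate image capture
    private val startVideo: () -> Unit,            // video capture
) {
    private var downAt: Long = 0L

    fun onTouchDown(p: Point, nowMillis: Long) {
        downAt = nowMillis
        adjustFocus(p) // touch-down is processed as a focus adjustment command
    }

    fun onTouchUp(nowMillis: Long) {
        val held = nowMillis - downAt
        // touch-up is processed as the capture command; the held period
        // distinguishes a short touch (still) from a long touch (video).
        if (held >= longTouchMillis) startVideo() else captureStill()
    }
}
```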
[0039] Hereinafter, a method of switching from the second state, in which the output of a graphic object on the display unit is limited, to the first state, in which a graphic object related to the image capture function overlaps the preview image, will be described in detail. In particular, FIGS. 7A to 7C are conceptual views illustrating an embodiment of switching from the second state, in which the output of a graphic object is limited, to the first state, in which a graphic object is output. In the mobile terminal according to one embodiment of the present invention, switching can be performed between the first state and the second state, according to a user request. In more detail, according to a preset type of touch detected in the region on which the preview image 300 is output in the second state, the controller 180 may switch the state of the display unit 151 from the second state to the first state, in which the graphic object 302 overlaps the preview image 300. When any one of a plurality of touches applied in different directions of motion, among the preset types of touch, is detected in the second state, the controller 180 may output the graphic object 302 related to the image capture function so as to overlap the preview image 300.

[0040] For example, as illustrated in (a) of FIG. 7A and in FIG. 5C, when any one touch (for example, a touch-and-drag applied in a downward direction) among the plurality of touches applied in different directions of motion is detected, the controller 180 can output the graphic object 302 related to the image capture function, as shown in (b) of FIG. 7A. Also, when any one of the plurality of touches applied in different directions of motion, among the preset types of touch, is detected in the region on which the preview image 300 is output in the second state, the controller 180 can determine the position at which a graphic object is to be output according to the direction of motion of the touch. For example, as illustrated in (a) of FIG. 7B, when any one touch (for example, a touch-and-drag applied in a rightward direction) among the plurality of touches applied in different directions of motion, among the preset types of touch, is detected in the second state, the controller 180 may output graphic objects 302a on one side (the left side) and the other side (the right side) corresponding to the direction of motion (for example, the rightward direction) of the sensed touch.

[0041] Further, as shown in (b) of FIG. 7B, when the graphic objects 302a are output and any one touch (for example, a touch applied in the rightward direction) among the plurality of touches is applied again, the controller 180 can additionally output a graphic object 302b different from the already-output graphic objects 302a. Here, preferably, the different graphic object 302b relates to an image capture function belonging to a sub-category of the already-output graphic objects 302a. However, the present invention is not limited thereto, and the graphic object 302b may be any graphic object different from the already-output graphic objects 302a. In addition, when a preset type of touch (for example, a touch applied in the leftward direction) is applied in the first state, in which a graphic object is output, the controller 180 can switch the state of the display unit 151 from the first state to the second state, in which the output of the graphic object is limited.
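The direction-dependent behavior described above can be modeled as a simple mapping from drag direction to action. The particular assignments below (downward reveals the graphic objects, rightward switches cameras, leftward and upward select capture modes) loosely mirror FIGS. 5C and 7A, but are otherwise arbitrary illustrative choices.

```kotlin
// Hypothetical mapping of drag directions to second-state actions.
enum class DragDirection { LEFT, RIGHT, UP, DOWN }

sealed interface SecondStateAction
object ShowGraphicObjects : SecondStateAction    // switch to the first state
object SwapFrontRearCamera : SecondStateAction   // front/rear camera switching
data class CaptureInMode(val mode: String) : SecondStateAction

fun actionForDrag(direction: DragDirection): SecondStateAction = when (direction) {
    DragDirection.DOWN -> ShowGraphicObjects
    DragDirection.RIGHT -> SwapFrontRearCamera
    DragDirection.LEFT -> CaptureInMode("first image capture mode")
    DragDirection.UP -> CaptureInMode("second image capture mode")
}
```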
In another example, when any one touch (for example, a touch-and-drag applied in a rightward direction) among the plurality of touches applied in different directions, among the preset types of touch, is detected in the second state, the controller 180 may perform a function associated with that touch instead of outputting the graphic object related to the image capture function. For example, the function associated with the touch may be the front/rear camera switching function, as shown in FIG. 5C. In addition, (a) of FIG. 7C is a view illustrating a state in which the front camera is activated. When a touch-and-drag applied in a rightward direction is detected in the second state, in which the output of a graphic object is limited, as illustrated in (a) of FIG. 7C, the controller 180 can switch the camera from the front camera to the rear camera by performing the camera switching function associated with the touch-and-drag applied in the rightward direction, as shown in (b) of FIG. 7C. Also, when the rear camera is activated and a touch (for example, a touch-and-drag applied in the rightward direction) associated with the front/rear camera switching function is detected, the controller 180 can switch the camera from the rear camera to the front camera. Meanwhile, the mobile terminal can output a graphic object according to various types of touch. In particular, FIGS. 8A to 8C are conceptual views illustrating another embodiment of switching from the second state, in which the output of a graphic object is limited, to the first state, in which a graphic object is output. In the second state, in which the output of a graphic object on the preview image 300 output on the display unit 151 is limited, when touches are detected at a plurality of points of the display unit 151, the controller 180 can switch the state of the display unit 151 from the second state to the first state, in which the graphic object 302 related to the image capture function overlaps the preview image 300. Here, changing the state of the display unit 151 from the second state to the first state can refer to outputting the graphic object.

[0042] In addition, when the plurality of points includes a first touch and a second touch, the controller 180 may output the graphic object in the vicinity of the touch point of at least one of the first and second touches. In one embodiment of the present invention, when touches are detected at a plurality of points 352 and 354 of the display unit 151, as shown in (a) of FIG. 8A, the controller 180 may output the graphic object 302 related to the image capture function in the vicinity of the touch point of one touch 352 among the first touch 352 and the second touch 354 corresponding to the plurality of detected points, as shown in (b) of FIG. 8A. Here, the output graphic object 302 related to the image capture function can correspond to the type of the touches detected at the plurality of points. In more detail, when the touches detected at the plurality of points of the display unit 151 correspond to any one (for example, a long touch) of the preset types of touch, the graphic object 302 may be output. In addition, the output graphic object 302 may be a plurality of objects, and when any one of the plurality of graphic objects 302 is selected (or touched), the plurality of graphic objects 302 may disappear.
After that, when a touch corresponding to any one (for example, a long touch) of the preset types of touch is detected at one point, rather than at a plurality of points, the controller 180 can capture an image via the function corresponding to the graphic object selected (or touched) by the user. For example, as illustrated in (a) of FIG. 8A, when a plurality of touches of a preset type (the long touch type) is detected, a graphic object may be output in the vicinity of the touch point of at least one touch 352 among the plurality of touches 352 and 354, as illustrated in (b) of FIG. 8A. The user can then select any one (for example, continuous shooting) of the output graphic objects. After that, as shown in (a) of FIG. 5B, when a touch corresponding to the preset type (the long touch type) is detected at one point, the controller 180 can capture an image via the function (continuous shooting) corresponding to the selected graphic object.

[0043] In another embodiment, as illustrated in (a) of FIG. 8B, when touches are detected at a plurality of points 352 and 354 of the display unit 151, the controller 180 can output the graphic object 302 related to the image capture function in the vicinity of the touch point of at least one touch 352 among the first touch 352 and the second touch 354 corresponding to the plurality of detected points, as shown in (b) of FIG. 8B. In addition, the graphic object 302 related to the image capture function can correspond to the type of touch detected at the plurality of points. In more detail, when the touches detected at the plurality of points of the display unit 151 are detected as being of any one type (for example, a touch-and-drag applied in a preset direction) of touch, the controller 180 can output a graphic object. The graphic object may be a plurality of objects, and when any one of the plurality of graphic objects is selected, the output graphic objects may disappear. After that, when a touch applied according to any one (for example, a touch-and-drag applied in a preset direction) of the preset types of touch is detected at one point, the controller 180 may capture an image via the function corresponding to the selected graphic object.

[0044] For example, as shown in (a) of FIG. 8B, when a plurality of touches of a preset type (for example, a touch-and-drag applied in a leftward direction) is detected, a graphic object may be output in the vicinity of the touch point of at least one touch 352 of the plurality of touches 352 and 354, as shown in (b) of FIG. 8B. The user can select any one (for example, a timer) of the output graphic objects. In this case, the controller 180 can set the function (for example, the timer) corresponding to the selected graphic object as the function to be performed when a single touch, rather than a plurality of touches, is detected as the preset type (for example, a touch-and-drag applied in a leftward direction). After that, as shown in FIG. 5C, when a touch corresponding to the preset type (a touch-and-drag applied in a leftward direction) is detected at one point, the controller 180 can capture an image via the function (timer) corresponding to the selected graphic object. In another embodiment, even when touches are not detected at a plurality of points, the controller 180 may switch the state of the display unit 151 from the second state to the first state.
That is, when a preset type of touch is detected at one point, the controller 180 may output the graphic object related to the image capture function in the vicinity of the touch point of the sensed touch. For example, when a preset type of touch (for example, a long touch) is detected in the region on which the preview image 300 is output, the controller 180 may output the graphic object 302 related to image capture in the vicinity of the detected touch, as illustrated in (a) of FIG. 8C, rather than immediately capturing a video of the preview image, as shown in (b) of FIG. 5B, or rather than adjusting the focus, as shown in (b) of FIG. 6B. After that, when any one of the plurality of graphic objects 302 is selected according to a user request, the controller 180 may execute the function (for example, continuous shooting) corresponding to the selected graphic object (for example, a burst icon).

[0045] As illustrated in (b) of FIG. 8C, when a touch-and-drag extending from a touch applied to any point of the region on which the preview image 300 is output is maintained at a point (long touch), the controller 180 may output the graphic object 302 related to image capture in the vicinity of the point at which the touch-and-drag is maintained. As described above, in the mobile terminal according to an embodiment of the present invention, even in the second state, the mobile terminal 100 can be switched to the first state by a simple manipulation. Also, when preset types of touch are detected at a plurality of points, a graphic object for setting the image capture function to be performed when the preset type of touch is detected at a single point may be output. Thus, even without a permanently displayed graphic object, the user can easily associate the image capture function with his desired type of touch and easily perform the associated image capture function by applying the touch again.

[0046] Hereinafter, a method of providing an additional function with respect to an image after the image is captured in the mobile terminal according to an embodiment of the present invention will be described in detail. In particular, FIGS. 9A to 9E are conceptual views illustrating a method of controlling captured images in the second state, in which the output of a graphic object is limited. As illustrated in (a) of FIG. 9A, in the second state, when a preset touch is detected in the region on which the preview image 300 is output, the preview image 300 may be captured.

[0047] After that, as shown in (b) of FIG. 9A, the controller 180 may output a thumbnail 400 of the captured image on a portion of the region on which the preview image 300 is output. The thumbnail 400 may overlap the preview image 300. In addition, the thumbnail 400 may be a thumbnail of the most recently captured image. Also, the thumbnail 400 may disappear when a preset period has elapsed or according to a user request. When a short touch is detected on the thumbnail 400, the controller 180 can enter a gallery to output an image stored in the memory 170. In addition, with the thumbnail 400 output, when a preset type of touch (for example, a long touch) is applied to the thumbnail 400, as illustrated in (c) of FIG. 9A, the controller 180 can output a captured image 500 corresponding to the thumbnail 400 on the region in which the preview image 300 is output, as shown in (d) of FIG. 9A. The captured image 500 may overlap the preview image 300.
[0048] Also, as shown in (d) of FIG. 9A, the thumbnail 400 of the captured image and thumbnails 401 and 402 of images that were captured before the most recently captured image 500 and stored in the memory can be output together, overlapping the output captured image 500. Also, the controller 180 may output, on the display unit 151, a trash graphic object 600 performing a function of deleting a captured image. The controller 180 may also determine whether to output at least one of the thumbnails 400, 401, and 402 and the trash graphic object 600 according to conditions or circumstances. For example, as shown in (d) of FIG. 9A, while a long touch is maintained, the controller 180 may output at least one of the thumbnails 400, 401, and 402 and the trash graphic object 600 on the display unit 151. Also, when release of the long touch is detected, the controller 180 can return the display unit 151 to the second state, in which only the preview image 300 is output, as shown in (a) of FIG. 9A. As illustrated in (a) of FIG. 9B, when a long touch is applied to the thumbnail 400 of the captured image, the controller 180 may output the image 500 corresponding to the thumbnail 400, overlapping the region in which the preview image 300 is output. After that, as shown in (b) of FIG. 9B, when a touch extending from the applied long touch is detected on the thumbnail 401 of a previously captured image, the controller 180 may switch the output image 500 to an image 501 corresponding to the thumbnail 401 on which the extended touch is detected. In other words, when the long touch is maintained and extends from the first thumbnail 400 to the second thumbnail 401, the controller 180 can switch from the image 500 corresponding to the first thumbnail 400 to the image 501 corresponding to the second thumbnail 401 and output the image 501 on the display unit 151. As shown in (c) of FIG. 9B, with the long touch detected on the thumbnail 401 maintained, when a touch-and-drag extending from the thumbnail 401 to the trash graphic object is detected, the previously captured image 501 corresponding to the thumbnail 401 on which the long touch was detected can be deleted. Here, the thumbnail 401 can be handled as data separate from the original image 501. In this case, when the thumbnail 401 on which the long touch was detected is dragged to the trash graphic object 600, the controller 180 can delete both the thumbnail 401 and the original image 501 corresponding to the thumbnail 401. Alternatively, instead of deleting both the thumbnail 401 and the corresponding original image together, only one of the thumbnail 401 and the corresponding original image may be deleted.

[0049] For example, when a long touch on any one of the plurality of thumbnails 400, 401, and 402 is detected, and a touch extending from the thumbnail 401 on which the long touch was selectively detected to the trash graphic object 600 is detected, the controller 180 can delete the selected thumbnail 401 and the previously captured image 501 corresponding to the selected thumbnail 401 from the memory unit. When the long touch is released in any of the states illustrated in (a), (b), and (c) of FIG. 9B, the controller 180 may return the display unit 151 to the second state, in which the output of a graphic object is limited, as shown in (d) of FIG. 9B.
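The thumbnail interactions of FIG. 9B (a long touch previews the stored image, an extended touch switches the preview, and a drag onto the trash graphic object 600 deletes both the thumbnail and its original) can be sketched as follows, with storage modeled as a simple in-memory map; all names are illustrative assumptions.

```kotlin
// Hypothetical model of the thumbnail preview / delete interactions.
class ThumbnailGallery {
    // Maps thumbnail identifiers to their original stored images.
    private val originals = linkedMapOf(
        "thumb400" to "image500",
        "thumb401" to "image501",
        "thumb402" to "image502",
    )
    var previewed: String? = null
        private set

    /** A maintained long touch on a thumbnail previews its original image. */
    fun onLongTouch(thumbId: String) {
        previewed = originals[thumbId]
    }

    /** A drag ending on the trash object deletes the thumbnail and its original. */
    fun onDragToTrash(thumbId: String) {
        val removed = originals.remove(thumbId)
        if (previewed == removed) previewed = null
    }

    /** Releasing the long touch returns to the second state (preview only). */
    fun onTouchRelease() {
        previewed = null
    }
}
```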
Further, in one embodiment of the present invention, an image previously stored in the memory may be output not only after the preview image is captured, but also before it is captured. In more detail, in the state in which the output of a graphic object related to the image capture function is limited, when a preset type of touch is detected, the controller 180 can output a thumbnail of an already captured image on the display unit. In this case, at least one thumbnail can be output. The thumbnail can be output according to a preset type of touch applied to a preset region, and the preset region may be the region on which the thumbnail is to be output. At least one region (hereinafter referred to as the "thumbnail region") on which the thumbnail is to be output may be positioned in at least a portion of the region on which the preview image is output. That is, the controller 180 may allocate at least a portion of the region in which the preview image is output as the thumbnail region. The thumbnail region can be set when an application or software is created, or can be set according to a user request. As shown in (a) of FIG. 9C, the controller 180 can set at least one region 700 (thumbnail region) for outputting a thumbnail.

[0050] After that, when a preset type of touch is detected in the thumbnail region, the controller 180 can output at least one thumbnail 400 of an image stored in the memory 170 on the thumbnail region 700, as illustrated in (b) of FIG. 9C. With the at least one thumbnail 400 output, when a touch (short touch) on any one of the output thumbnails 400 is detected, the controller 180 can enter a gallery to output an image stored in the memory 170. Also, with the thumbnails 400 output, when a touch (long touch) on any one of the output thumbnails 400 is detected, the controller 180 may provide a control for performing a function such as described above with reference to FIGS. 9A and 9B. In addition, the display size and the number of the output thumbnails can be determined according to the area of the touch detected on the display unit 151, and the thumbnail region 700 can likewise be determined according to the area of the touch sensed on the display unit 151. In more detail, the controller 180 can detect the area of the touch sensed on the display unit 151 and, depending on that area, adjust the size of the at least one region (thumbnail region) for outputting a thumbnail. For example, the size of the thumbnail region may be proportional to the area of the touch detected on the display unit 151. The controller 180 may determine at least one of the number and the display size of the output thumbnails according to the size of the thumbnail region, and output one or more thumbnails on the thumbnail region according to the determined number and display size. For example, as illustrated in (a) of FIG. 9D, when the area of the touch sensed in the thumbnail region 700 is small, the controller 180 can adjust (for example, reduce) the size of the thumbnail region 700 and output the thumbnails 400 according to the adjusted size of the thumbnail region 700. As shown in (b) of FIG. 9D, when the area of the touch detected in the thumbnail region 700 is large, the controller 180 can adjust the size of the thumbnail region 700 to a size corresponding to the area and output the thumbnails 400 according to the adjusted size of the thumbnail region 700.
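Before the two cases of FIG. 9D are compared, the area-dependent rule just described can be expressed as a short computation; the constants, clamping bounds, and the simple proportionality law are assumptions made for illustration only.

```kotlin
// Hypothetical layout rule: thumbnail size grows with the touch area,
// and the count follows from how many thumbnails fit in the region.
data class ThumbnailLayout(val thumbnailSizePx: Int, val count: Int)

fun layoutForTouch(touchAreaPx: Int, regionWidthPx: Int = 720): ThumbnailLayout {
    // Thumbnail size proportional to the touch area, clamped to sane bounds.
    val size = (touchAreaPx / 50).coerceIn(48, 240)
    // A larger thumbnail leaves room for fewer of them across the region.
    val count = (regionWidthPx / size).coerceAtLeast(1)
    return ThumbnailLayout(size, count)
}
```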
When (a) and (b) of FIG. 9D are compared, (a) involves a small touch area relative to (b), and thus a larger number of thumbnails having a small size can be output, while (b) involves a large touch area relative to (a), and thus a smaller number of thumbnails having a large size can be output. With this configuration, in one embodiment of the present invention, in the state in which the output of a graphic object is limited, the user's need to control an image stored in the memory is satisfied even before a preview image is captured. Further, in one embodiment of the present invention, in the second state, in which the output of a graphic object is limited, a thumbnail of an image stored in the memory may be output using an image analysis function. In more detail, when a preset type of touch is detected on a subject included in the preview image, the controller 180 may output a thumbnail of an image obtained by capturing the subject corresponding to the detected touch. For example, as shown in (a) of FIG. 9E, when a preset type of touch (for example, a long touch) is detected on a subject output on the preview image 300, the controller 180 can perform an image analysis on the subject. Based on the image analysis results, the controller 180 can extract an image corresponding to the results from the images stored in the memory 170 and output the extracted image on the display unit 151. In this case, the controller 180 may output the extracted image in the vicinity of the region in which the preset type of touch was detected, or on a preset region (a region for outputting a thumbnail). With this configuration, in one embodiment of the present invention, even when the output of a graphic object is limited, an image related to a subject included in the preview image can be output. Thus, the captured images of the subject can be easily controlled through a simple manipulation, and the user's need to capture images of various figures of a subject is satisfied. As described above, in the mobile terminal according to one embodiment of the present invention, a thumbnail for controlling a captured image can be output, and the captured image can be controlled and deleted using the thumbnail. Thus, since the user can control and delete the captured image even in the second state, in which a graphic object is not output, the convenience of the user can be increased.

[0051] Hereinafter, another embodiment in which the image capture function is performed in the second state, in which the output of a graphic object is limited, will be described in detail. In particular, FIGS. 10A to 10D are conceptual views illustrating a method of performing the image capture function in the second state, in which the output of a graphic object is limited. The controller 180 may perform image capture on a preview image in the second state using other hardware provided in the mobile terminal. For example, when a preset type of touch is applied to the microphone 122 of the mobile terminal, the controller 180 may capture the preview image 300 output in the second state. When a preset type of tap or touch is detected on the microphone 122, as shown in FIG. 10A, or when the microphone 122 is covered so as to block noise, as shown in FIG. 10B, the controller 180 can capture the preview image 300 displayed on the display unit. In another example, as shown in FIG. 10C, the controller 180 can detect the movement of a user through the camera 121a and perform an image capture function
[0051] Hereinafter, another embodiment in which an image capture function is performed in the second state, in which the output of a graphic object is limited, will be described in detail. In particular, Figs. 10A to 10D are conceptual views illustrating a method of performing an image capture function in the second state in which the output of a graphic object is limited.

The controller 180 may perform image capture on a preview image in the second state using a different hardware configuration provided in the mobile terminal. For example, when a preset type of touch is applied to the microphone 122 of the mobile terminal, the controller 180 may capture an image on the preview image 300 output in the second state. When a preset type of tap or touch is detected on the microphone 122, as shown in Fig. 10A, or when the microphone 122 is covered so as to block noise, as shown in Fig. 10B, the controller 180 can capture an image on the preview image 300 displayed on the display unit. In another example, as shown in Fig. 10C, the controller 180 can detect the movement of a user through the camera 121a and perform an image capture function on the preview image 300 based on the detected motion. In addition, the controller 180 can recognize the face of a user, as well as the movement of a user, through the camera 121a, and capture an image on the preview image accordingly. Also, the controller 180 may perform the image capture function on the preview image 300 based on a user's motion sensed through the proximity sensor 141. The user's motion may include covering the proximity sensor 141 for a preset period, or the like, as shown in Fig. 10D (a brief illustrative sketch of this trigger follows at the end of this description).

Further, the embodiment of the present invention as described above may be installed as a basic function of a mobile terminal when the mobile terminal is offered for sale, or may be provided in the form of an application that can be downloaded from an external server using wireless communication. Thus, when a downloaded application is installed in the mobile terminal, the functions according to the embodiment of the present invention can be provided in the mobile terminal. In the embodiments of the present invention, the foregoing method can be implemented as codes that can be read by a processor in a program-stored medium. The processor-readable medium may include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like. The processor-readable medium also includes implementations in the form of carrier waves (e.g., transmission over the Internet).

[0052] The mobile terminal according to the embodiments of the present invention is not limited in the application of the foregoing configurations and methods; rather, all or part of the embodiments may be selectively combined to yield various modifications. The foregoing embodiments and advantages are merely illustrative and should not be construed as limiting the present invention. The present teachings can be readily applied to other types of apparatus. This description is intended to be illustrative and not to limit the scope of the claims; many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other characteristics of the embodiments described herein may be combined in various ways to obtain additional and/or alternative embodiments. Since the present features may be embodied in several forms without departing from their characteristics, it should also be understood that the embodiments described above are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within the scope defined in the appended claims; all changes and modifications that fall within the metes and bounds of the claims, or equivalents of such metes and bounds, are therefore intended to be embraced by the appended claims.
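By way of illustration only, the proximity-cover trigger of Fig. 10D may be sketched as follows; the hold threshold, the injected clock, and the capture callback are illustrative assumptions rather than a definitive implementation:

```kotlin
// Sketch of a proximity-cover capture trigger (cf. Fig. 10D). The holdMillis
// threshold, injected clock, and capture callback are illustrative only.
class ProximityCaptureTrigger(
    private val holdMillis: Long = 700L,
    private val now: () -> Long = System::currentTimeMillis,
    private val capture: () -> Unit
) {
    private var coveredSince: Long? = null

    // Feed this with each proximity reading: true while the sensor is covered.
    fun onProximity(covered: Boolean) {
        if (!covered) {
            coveredSince = null          // cover released too early: reset
            return
        }
        val since = coveredSince ?: now().also { coveredSince = it }
        if (now() - since >= holdMillis) {
            coveredSince = null          // re-arm for the next gesture
            capture()                    // capture the preview image
        }
    }
}

fun main() {
    val trigger = ProximityCaptureTrigger(capture = { println("capture!") })
    trigger.onProximity(true)            // covered...
    Thread.sleep(750)
    trigger.onProximity(true)            // still covered past the threshold -> fires
}
```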
Claims (15)

[0001] A mobile terminal, comprising: a wireless communication unit (110) configured to perform wireless communication; a camera (121) configured to obtain an image; a display unit (151) configured to display a preview image obtained through the camera (121); and a control apparatus (180) configured to: control the display unit (151) to operate in any one of a first state, in which a graphic object relating to an image capture function is displayed overlapping the preview image, and a second state, in which the graphic object is not displayed while the preview image is displayed, according to a user request, and when a first preset type of touch is detected in a region on which the preview image is displayed in the second state, control the camera (121) to capture the preview image according to the preset type of touch.

[0002] The mobile terminal of claim 1, wherein when the display unit (151) is in the second state, the control apparatus (180) is further configured to perform different functions according to different preset types of touch detected in the region where the preview image is displayed.

[0003] The mobile terminal according to any one of claims 1 and 2, wherein the control apparatus (180) is further configured to control the camera (121) to capture a still image of the preview image according to the first preset type of touch and to capture a video of the preview image according to a different preset type of touch.

[0004] The mobile terminal according to any one of claims 1 to 3, wherein the first preset type of touch is any one of a plurality of touches applied in different directions of motion, the plurality of touches being associated with different image capture modes, respectively, and wherein the control apparatus (180) is further configured to perform the image capture function on the preview image in an image capture mode, among the different image capture modes, associated with the detected first preset type of touch.

[0005] The mobile terminal according to any one of claims 1 to 4, wherein the first preset type of touch is any one of a touch-and-drag applied in a first direction and a touch-and-drag applied in a second direction different from the first direction, wherein the touch-and-drag applied in the first direction is associated with a first image capture mode and the touch-and-drag applied in the second direction is associated with a second image capture mode different from the first image capture mode, and wherein the control apparatus (180) is further configured to perform the image capture function on the preview image in the first or second image capture mode associated with the direction of the applied touch-and-drag.

[0006] The mobile terminal according to any one of claims 1 to 5, wherein when a second preset type of touch is detected in the region on which the preview image is displayed in the second state, the control apparatus (180) is further configured to adjust a focus of the preview image based on a region in which the touch is detected.

[0007] The mobile terminal according to any one of claims 1 to 6, wherein when a second preset type of touch corresponding to a touch-and-drag applied in a preset direction of motion is detected in the second state, the control apparatus (180) is further configured to adjust a focus of the preview image based on at least one of a start point and an end point of the touch-and-drag on the preview image.
[0008] The mobile terminal according to any one of claims 1 to 7, wherein when the first preset type of touch includes touches detected on a plurality of points on the display unit (151) in the second state, the control apparatus (180) is further configured to switch a state of the display unit (151) from the second state to the first state.

[0009] The mobile terminal of claim 8, wherein the touches on the plurality of points include a first touch and a second touch, and wherein the control apparatus (180) is further configured to display the graphic object in the vicinity of a touch point of at least one of the first and second touches.

[0010] The mobile terminal of any one of claims 1 to 9, wherein when the first preset type of touch is a short touch, the control apparatus (180) is further configured to capture the preview image as a still image, and when the first preset type of touch is a long touch, the control apparatus (180) is further configured to capture a video of the image obtained through the camera (121).

[0011] A method of controlling a mobile terminal, the method comprising: displaying, through a display unit (151) of the mobile terminal, a preview image obtained through a camera (121) of the mobile terminal; controlling, via a control apparatus (180) of the mobile terminal, the display unit (151) to operate in any one of a first state, in which a graphic object relating to an image capture function is displayed overlapping the preview image, and a second state, in which the graphic object is not displayed while the preview image is displayed, according to a user request; and when a first preset type of touch is detected in a region on which the preview image is displayed in the second state, controlling, through the control apparatus (180), the camera (121) to capture the preview image according to the preset type of touch.

[0012] The method of claim 11, wherein when the display unit (151) is in the second state, the method further comprises executing different functions according to different preset types of touch detected in the region on which the preview image is displayed.

[0013] The method of any one of claims 11 and 12, further comprising: controlling the camera (121) to capture a still image of the preview image based on the first preset type of touch and to capture a video of the preview image according to a different preset type of touch.

[0014] The method of any one of claims 11 to 13, wherein the first preset type of touch is any one of a plurality of touches applied in different directions of motion, the plurality of touches being associated with different image capture modes, respectively, and wherein the method further comprises executing the image capture function on the preview image in an image capture mode, among the different image capture modes, associated with the detected first preset type of touch.

[0015] The method of any one of claims 11 to 14, wherein the first preset type of touch is any one of a touch-and-drag applied in a first direction and a touch-and-drag applied in a second direction different from the first direction, wherein the touch-and-drag applied in the first direction is associated with a first image capture mode and the touch-and-drag applied in the second direction is associated with a second image capture mode different from the first image capture mode, and wherein the method further comprises executing the image capture function on the preview image in the first or second image capture mode associated with the direction of the applied touch-and-drag.